Transfer Learning by Discovering Latent Task Parametrizations
Abstract
We present a framework that discovers, from data, the latent factors that parametrize a family of related tasks. The resulting model rapidly identifies the dynamics of a new task instance, allowing an agent to adapt flexibly to task variations.
Similar Resources
Hidden Parameter Markov Decision Processes: A Semiparametric Regression Approach for Discovering Latent Task Parametrizations
Control applications often feature tasks with similar, but not identical, dynamics. We introduce the Hidden Parameter Markov Decision Process (HiP-MDP), a framework that parametrizes a family of related dynamical systems with a low-dimensional set of latent factors, and introduce a semiparametric regression approach for learning its structure from data. We show that a learned HiP-MDP rapidly id...
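The core idea can be sketched in a minimal, hypothetical form (the paper itself uses a semiparametric Gaussian-process model, not the toy linear dynamics below): a family of related dynamical systems shares a set of basis functions, each task instance is described by a low-dimensional vector of latent weights, and a new instance's weights can be identified from a handful of observed transitions by least squares. The specific basis functions and dimensions here are assumptions for illustration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical 1-D family of dynamics: each task instance b obeys
#   s' = w_b[0] * sin(s + a) + w_b[1] * (s - a),
# i.e. shared basis functions weighted by instance-specific latent parameters.
def basis(s, a):
    return np.stack([np.sin(s + a), s - a], axis=-1)

def step(s, a, w):
    return basis(s, a) @ w

# A new task instance with unknown latent parameters.
w_true = np.array([0.7, -0.3])
s = rng.uniform(-1, 1, size=20)   # a few observed states
a = rng.uniform(-1, 1, size=20)   # and actions
s_next = step(s, a, w_true)       # observed next states (noise-free here)

# Rapid identification: least-squares fit of w_b from the observed transitions.
Phi = basis(s, a)
w_hat, *_ = np.linalg.lstsq(Phi, s_next, rcond=None)
print(np.allclose(w_hat, w_true, atol=1e-6))  # → True
```

With noise-free transitions the latent weights are recovered exactly; in the paper's setting the regression is Bayesian, so the agent also gets uncertainty over the latent parameters.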
Bayesian inference with posterior regularization and applications to infinite latent SVMs
Existing Bayesian models, especially nonparametric Bayesian methods, rely on specially conceived priors to incorporate domain knowledge for discovering improved latent representations. While priors affect posterior distributions through Bayes’ rule, imposing posterior regularization is arguably more direct and in some cases more natural and general. In this paper, we present regularized Bayesia...
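The contrast the abstract draws can be illustrated with a minimal sketch on a discrete toy model (an assumption for illustration; the paper applies the idea to nonparametric models and SVM-style constraints): instead of encoding domain knowledge in the prior, an expectation constraint is imposed directly on the posterior, and the regularized posterior is the KL-projection of the Bayes posterior onto the constraint set.

```python
import numpy as np

# Toy model: a discrete posterior over 3 hypotheses.
prior = np.array([0.2, 0.3, 0.5])
lik   = np.array([0.9, 0.5, 0.1])

p_bayes = prior * lik
p_bayes /= p_bayes.sum()          # standard Bayes posterior

# Domain knowledge as an expectation constraint E_q[phi] <= c,
# imposed directly on the posterior rather than through the prior.
phi = np.array([1.0, 0.0, -1.0])
c = 0.0

def project(lmbda):
    # min KL(q || p_bayes) s.t. E_q[phi] <= c has the exponential-family
    # form q ∝ p_bayes * exp(-lmbda * phi) with dual variable lmbda >= 0.
    q = p_bayes * np.exp(-lmbda * phi)
    return q / q.sum()

# E_q[phi] decreases monotonically in lmbda, so bisect until the
# constraint holds with (near) equality.
lo, hi = 0.0, 50.0
for _ in range(100):
    mid = 0.5 * (lo + hi)
    if project(mid) @ phi > c:
        lo = mid
    else:
        hi = mid
q = project(0.5 * (lo + hi))
```

The Bayes posterior here violates the constraint (its expected feature is positive), and the projection reweights it just enough to satisfy `E_q[phi] <= c`, which is the "more direct" mechanism the abstract contrasts with prior design.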
Closing the Loop with Quantitative Cognitive Task Analysis
Many educational data mining studies have explored methods for discovering cognitive models and have emphasized improving prediction accuracy. Too few studies have “closed the loop” by applying discovered models toward improving instruction and testing whether proposed improvements achieve higher student outcomes. We claim that such application is more effective when interpretable, explanatory ...
Infinite Latent SVM for Classification and Multi-task Learning
Unlike existing nonparametric Bayesian models, which rely solely on specially conceived priors to incorporate domain knowledge for discovering improved latent representations, we study nonparametric Bayesian inference with regularization on the desired posterior distributions. While priors can indirectly affect posterior distributions through Bayes’ theorem, imposing posterior regularization is...